
    A Neural Networks Committee for the Contextual Bandit Problem

    This paper presents a new contextual bandit algorithm, NeuralBandit, which requires no stationarity assumption on contexts or rewards. Several neural networks are trained to model the value of rewards given the context. Two variants, based on a multi-expert approach, are proposed to choose the parameters of the multi-layer perceptrons online. The proposed algorithms are successfully tested on a large dataset with and without stationarity of rewards. Comment: 21st International Conference on Neural Information Processing
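
    As a rough illustration of the setup described above (not the authors' implementation: the paper's committee and multi-expert parameter-selection variants are not reproduced, and all names and hyperparameters here are illustrative), one small multi-layer perceptron per arm can be trained online to predict that arm's reward given the context, with an epsilon-greedy choice over the predictions:

    ```python
    import numpy as np

    class TinyMLP:
        """One-hidden-layer regressor trained online with SGD (illustrative only)."""
        def __init__(self, n_in, n_hidden=16, lr=0.05, rng=None):
            rng = rng or np.random.default_rng()
            self.W1 = rng.normal(0, 0.1, (n_in, n_hidden))
            self.b1 = np.zeros(n_hidden)
            self.W2 = rng.normal(0, 0.1, n_hidden)
            self.b2 = 0.0
            self.lr = lr

        def predict(self, x):
            self.h = np.tanh(x @ self.W1 + self.b1)   # cache hidden layer for backprop
            return self.h @ self.W2 + self.b2

        def update(self, x, target):
            err = self.predict(x) - target             # squared-loss gradient
            dh = err * self.W2 * (1 - self.h ** 2)
            self.W2 -= self.lr * err * self.h
            self.b2 -= self.lr * err
            self.W1 -= self.lr * np.outer(x, dh)
            self.b1 -= self.lr * dh

    class NeuralBanditSketch:
        """One reward model per arm; epsilon-greedy over predicted rewards."""
        def __init__(self, n_arms, n_in, eps=0.1):
            self.models = [TinyMLP(n_in) for _ in range(n_arms)]
            self.eps, self.rng = eps, np.random.default_rng()

        def act(self, x):
            if self.rng.random() < self.eps:           # explore
                return int(self.rng.integers(len(self.models)))
            return int(np.argmax([m.predict(x) for m in self.models]))

        def learn(self, x, arm, reward):
            self.models[arm].update(x, reward)         # update only the played arm
    ```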

    Analysis and design of a flat central finned-tube radiator

    A computer program based on a fixed conductance parameter yields a minimum-weight design. A second program employs a variable conductance parameter and a variable ratio of fin length to tube outside radius, and is used for radiator designs with geometric limitations. The major outputs of the two programs are given.
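
    The two design codes themselves are not available here, but a hedged sketch can show the kind of computation involved: a parametric sweep over fin length and thickness that evaluates a radiation conductance parameter and keeps the lightest geometry meeting the heat-rejection requirement. The efficiency correlation below is a crude placeholder, not the NASA programs' correlation.

    ```python
    SIGMA = 5.670e-8   # Stefan-Boltzmann constant, W/(m^2 K^4)

    def fin_heat_rejection(L, t, k, eps, T_base, width=1.0):
        """Approximate heat rejected by one radiating rectangular fin (W).

        Uses a dimensionless radiation conductance parameter and a crude
        placeholder efficiency fit; real design codes use tabulated curves.
        """
        lam = SIGMA * eps * T_base**3 * L**2 / (k * t)   # conductance parameter
        eta = 1.0 / (1.0 + 1.125 * lam)                  # placeholder fin efficiency
        return eta * 2.0 * width * L * SIGMA * eps * T_base**4  # both faces radiate

    def lightest_fin(q_required, k, eps, T_base, rho):
        """Brute-force sweep over (length, thickness) for the minimum-mass fin."""
        best = None
        for i in range(1, 101):
            for j in range(1, 21):
                L, t = 0.01 * i, 0.0005 * j              # metres
                if fin_heat_rejection(L, t, k, eps, T_base) >= q_required:
                    mass = rho * L * t                   # per unit width, kg/m
                    if best is None or mass < best[0]:
                        best = (mass, L, t)
        return best                                      # (mass, L, t) or None
    ```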

    Spinodal fractionation in a polydisperse square well fluid

    Using Kinetic Monte Carlo simulation, we model gas-liquid spinodal decomposition in a size-polydisperse square well fluid, representing a 'near-monodisperse' colloidal dispersion. We find that fractionation (demixing) of particle sizes between the phases begins asserting itself shortly after the onset of phase ordering. Strikingly, the direction of size fractionation can be reversed by a seemingly trivial choice between two inter-particle potentials which, in the monodisperse case, are identical -- we rationalise this in terms of a perturbative, equilibrium theory of polydispersity. Furthermore, our quantitative results show that Kinetic Monte Carlo simulation can provide detailed insight into the role of fractionation in real colloidal systems. Comment: 7 pages, 7 figures, to be published in Phys. Rev.
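
    The "seemingly trivial choice" between potentials can be made concrete with a minimal sketch (not the paper's simulation code; the well depth and range here are illustrative): for an additive hard-core diameter, the attractive well can end either at a multiple of the pair diameter or at a fixed distance beyond it, and the two choices coincide only when all particles share one size.

    ```python
    def square_well_energy(r, sigma_i, sigma_j, eps=1.0, lam=1.25,
                           scaled_range=True):
        """Pair energy in a size-polydisperse square-well fluid (illustrative)."""
        sigma_ij = 0.5 * (sigma_i + sigma_j)       # additive hard-core diameter
        if r < sigma_ij:
            return float("inf")                    # hard-core overlap forbidden
        if scaled_range:
            well_edge = lam * sigma_ij             # variant A: range scales with size
        else:
            well_edge = sigma_ij + (lam - 1.0)     # variant B: fixed-width well
                                                   # (units where the mean diameter is 1)
        return -eps if r < well_edge else 0.0
    ```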

    Bootstrapping Monte Carlo Tree Search with an Imperfect Heuristic

    We consider the problem of using a heuristic policy to improve the value approximation of the Upper Confidence Bounds applied to Trees (UCT) algorithm in non-adversarial settings such as planning with large state-space Markov decision processes. Current improvements to UCT focus on either changing the action selection formula at the internal nodes or the rollout policy at the leaf nodes of the search tree. In this work, we propose to add an auxiliary arm to each internal node and to always use the heuristic policy to roll out simulations at the auxiliary arms. The method aims for fast convergence to optimal values at states where the heuristic policy is optimal, while retaining an approximation similar to that of the original UCT in other states. We show that the resulting algorithm, UCT-Aux, outperforms the original UCT algorithm and its variants in two benchmark experiment settings. We also examine conditions under which UCT-Aux works well. Comment: 16 pages, accepted for presentation at ECML'1
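
    A minimal sketch of the auxiliary-arm idea as the abstract describes it follows (tree descent and expansion are elided, and the class names, exploration constant, and rollout interface are assumptions, not the paper's code):

    ```python
    import math

    class Arm:
        def __init__(self, is_aux=False):
            self.is_aux = is_aux
            self.visits, self.total_reward = 0, 0.0

    class Node:
        def __init__(self, state, n_actions):
            self.state, self.visits = state, 0
            self.children = [Arm() for _ in range(n_actions)]  # one per real action
            self.aux = Arm(is_aux=True)                        # extra heuristic arm

    def ucb1(arm, parent_visits, c=math.sqrt(2)):
        if arm.visits == 0:
            return float("inf")                # try every arm at least once
        mean = arm.total_reward / arm.visits
        return mean + c * math.sqrt(math.log(parent_visits) / arm.visits)

    def one_simulation(node, rollout, heuristic_policy, default_policy):
        """One pass through a single node (tree recursion elided)."""
        arm = max(node.children + [node.aux],
                  key=lambda a: ucb1(a, node.visits + 1))
        # The auxiliary arm is always completed with the heuristic policy;
        # ordinary arms use the usual default rollout policy.
        policy = heuristic_policy if arm.is_aux else default_policy
        reward = rollout(node.state, policy)   # caller supplies the simulator
        arm.visits += 1
        arm.total_reward += reward
        node.visits += 1
        return reward
    ```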

    W49A North - Global or Local or No Collapse?

    We attempt to fit observations with 5" resolution of the J=2-1 transition of CS in the directions of H II regions A, B, and G of W49A North, as well as observations with 20" resolution of the J=2-1, 3-2, 5-4, and 7-6 transitions in the directions of H II regions A and G, by using radiative transfer calculations. These calculations predict the intensity profiles resulting from several spherical clouds along the line of sight. We consider three models: global collapse of a very large (5 pc radius) cloud, localized collapse from smaller (1 pc) clouds around individual H II regions, and multiple, static clouds. For all three models we can find combinations of parameters that reproduce the CS profiles reasonably well, provided that the component clouds have a core-envelope structure with a temperature gradient. Cores with high temperature and high molecular hydrogen density are needed to match the higher transitions (e.g. J=7-6) observed towards A and G. The lower-temperature, low-density gas needed to create the inverse P-Cygni profile seen in the CS J=2-1 line (with 5" beam) towards H II region G arises from different components in the three models. The infalling envelope of cloud G plus cloud B creates the absorption in global collapse, cloud B is responsible in local collapse, and a separate cloud, G', is needed in the case of many static clouds. The exact nature of the velocity field in the envelopes for the case of local collapse is not important as long as it is in the range of 1 to 5 km/s for a turbulent velocity of about 6 km/s. High resolution observations of the J=1-0 and 5-4 transitions of CS and C34S may distinguish between these three models. Modeling existing observations of HCO+ and C18O does not allow one to distinguish between the three models but does indicate the existence of a bipolar outflow. Comment: 42 pages, 27 figures, accepted for publication in the ApJS August 2004, v153 issue
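
    The models themselves require full radiative transfer through nested spherical clouds, which is beyond a short sketch. Purely as an illustration of the spectral signature being fitted, a toy inverse P-Cygni profile (blueshifted emission plus redshifted absorption from infalling foreground gas; all numbers below are arbitrary, not fitted values) can be built as:

    ```python
    import numpy as np

    def gaussian(v, v0, fwhm, amp):
        sigma = fwhm / 2.3548                      # FWHM -> standard deviation
        return amp * np.exp(-0.5 * ((v - v0) / sigma) ** 2)

    v = np.linspace(-20.0, 20.0, 401)              # velocity grid, km/s
    emission = gaussian(v, v0=-2.0, fwhm=6.0, amp=1.0)    # blueshifted emission
    absorption = gaussian(v, v0=3.0, fwhm=4.0, amp=0.6)   # redshifted absorption
    profile = emission - absorption                # inverse P-Cygni line shape
    ```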

    The equilibrium intrinsic crystal-liquid interface of colloids

    We use confocal microscopy to study an equilibrated crystal-liquid interface in a colloidal suspension. Capillary waves roughen the surface, but locally the intrinsic interface is sharply defined. We use local measurements of the structure and dynamics to characterize the intrinsic interface, and different measurements find slightly different widths for this interface. In terms of the particle diameter d, this width is either 1.5d (based on structural information) or 2.4d (based on dynamics), both not much larger than the particle size. This work is the first direct experimental visualization of an equilibrated crystal-liquid interface. Comment: 6 pages; revised version, submitted to PNAS
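
    As one hedged illustration of how such a width can be quantified (the authors' structural and dynamical estimators are not reproduced here), a per-particle crystallinity order parameter binned along the interface normal can be fit with an error-function profile whose width parameter is reported in units of the particle diameter d:

    ```python
    import numpy as np
    from scipy.special import erf
    from scipy.optimize import curve_fit

    def erf_profile(z, z0, w, phi_liq, phi_xtal):
        """Crystallinity order parameter vs. height across the interface."""
        return phi_liq + 0.5 * (phi_xtal - phi_liq) * (1.0 + erf((z - z0) / w))

    def interface_width(z_bins, phi_bins):
        """Fit the binned profile and return the erf width parameter."""
        z, phi = np.asarray(z_bins), np.asarray(phi_bins)
        p0 = [np.median(z), 2.0, phi.min(), phi.max()]   # rough starting guesses
        popt, _ = curve_fit(erf_profile, z, phi, p0=p0)
        return abs(popt[1])   # width, in the same units as z (e.g., diameters d)
    ```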

    Oblivion: Mitigating Privacy Leaks by Controlling the Discoverability of Online Information

    Search engines are the prevalently used tools to collect information about individuals on the Internet. Search results typically comprise a variety of sources that contain personal information -- either intentionally released by the person herself, or unintentionally leaked or published by third parties, often with detrimental effects on the individual's privacy. To grant individuals the ability to regain control over their disseminated personal information, the European Court of Justice recently ruled that EU citizens have a right to be forgotten, in the sense that indexing systems must offer them technical means to request removal of links from search results that point to sources violating their data protection rights. As of now, these technical means consist of a web form that requires a user to manually identify all relevant links upfront and insert them into the web form, followed by a manual evaluation by employees of the indexing system to assess whether the request is eligible and lawful. We propose Oblivion, a universal framework to support the automation of the right to be forgotten in a scalable, provable, and privacy-preserving manner. First, Oblivion enables a user to automatically find and tag her disseminated personal information using natural language processing and image recognition techniques, and to file a request in a privacy-preserving manner. Second, Oblivion provides indexing systems with an automated and provable eligibility mechanism, asserting that the author of a request is indeed affected by an online resource. The automated eligibility proof ensures censorship resistance, so that only legitimately affected individuals can request the removal of corresponding links from search results. We have conducted comprehensive evaluations, showing that Oblivion is capable of handling 278 removal requests per second and is hence suitable for large-scale deployment.
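
    A minimal sketch of the first step described above follows, with spaCy's off-the-shelf named-entity recognizer standing in for Oblivion's unspecified NLP/image pipeline; the function and the matching rule are illustrative assumptions, not the paper's design.

    ```python
    import spacy

    # Assumes the small English model is installed:
    #   python -m spacy download en_core_web_sm
    nlp = spacy.load("en_core_web_sm")

    def tag_personal_info(text, user_identifiers):
        """Return named entities in `text` that match the requesting user."""
        doc = nlp(text)
        personal = [(ent.text, ent.label_) for ent in doc.ents
                    if ent.label_ in {"PERSON", "GPE", "ORG"}]
        # Keep only entities that plausibly refer to the requesting user.
        return [(t, lbl) for (t, lbl) in personal
                if any(u.lower() in t.lower() for u in user_identifiers)]
    ```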

    STATISTICAL ISSUES IN NEXT-GENERATION SEQUENCING

    High-throughput deep sequencing, or next-generation sequencing, has emerged as an exciting new tool in a great number of applications (e.g., variant discovery, profiling of histone modifications, identifying transcription factor binding sites, resequencing, and transcriptome characterization). Even though this technology has generated unprecedented amounts of data in the scientific community, few studies have looked carefully at its inherent variability. Recent studies of mRNA expression levels found little appreciable technical variation in Illumina's Solexa sequencing platform (a next-generation sequencing device). Although these results are encouraging, they are limited to a specific platform and application, and were made without any attention to experimental design. This paper provides an overview of some key issues in data management and experimental design related to Illumina's Solexa Genome Analyzer technology.
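
    One concrete example of the kind of variability check this literature performs (a hedged sketch, not this paper's analysis): under pure sequencing noise, a gene's read count in one technical-replicate lane, given the pair total, is binomial with a proportion fixed by the lane depths, so departures from that null signal a technical lane effect.

    ```python
    from scipy.stats import binomtest

    def lane_effect_pvalue(x1, x2, n1_total, n2_total):
        """Exact test of H0: lanes differ only by depth, for one gene.

        x1, x2: read counts for the gene in lanes 1 and 2;
        n1_total, n2_total: total mapped reads in each lane.
        """
        p_null = n1_total / (n1_total + n2_total)  # expected share in lane 1
        return binomtest(x1, n=x1 + x2, p=p_null).pvalue
    ```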

    FUNCTIONAL DIVERGENCE OF DUPLICATED GENES IN THE SOYBEAN GENOME

    The soybean genome has undergone many different evolutionary changes that are observable with modern technologies. Of particular interest to scientists and plant breeders is the fact that the soybean genome exhibits features of a genome duplication from millions of years ago. Genes that were copied during the duplication event have since diverged functionally. Identifying functionally divergent duplicate genes may provide insight into the evolution of soybean. To investigate functional divergence, transcripts from seven different tissue samples of pooled soybean messenger RNA were sequenced using the Solexa next-generation sequencer and analyzed for gene expression. We tested differential expression of duplicated genes within tissues by employing an integrated normalization and statistical testing methodology. Blocks of duplicate genes (i.e., gene sets) were tested for unanimity of over- or under-expression. These same genes were also analyzed for differential expression across tissues. We identified thousands of duplicate genes that displayed differential expression patterns within each tissue. In some cases these genes were over-represented in duplicate blocks, suggestive of functional divergence of a large genomic region.
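
    A hedged sketch of a within-tissue test for one duplicate pair follows; the paper's integrated normalization and testing methodology is not reproduced, and the length-based null proportion is an assumption made for illustration.

    ```python
    from scipy.stats import binomtest

    def duplicate_divergence_pvalue(count_a, count_b, length_a, length_b):
        """Test H0: the two gene copies are equally expressed per base.

        count_a, count_b: mRNA-Seq read counts for the two copies in one tissue;
        length_a, length_b: transcript lengths, since longer genes attract
        more reads under uniform per-base expression.
        """
        p_null = length_a / (length_a + length_b)
        return binomtest(count_a, n=count_a + count_b, p=p_null).pvalue
    ```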